205 research outputs found

    Synthesising Strategy Improvement and Recursive Algorithms for Solving 2.5 Player Parity Games

    2.5 player parity games combine the challenges posed by 2.5 player reachability games and the qualitative analysis of parity games. These two types of problems are best approached with different types of algorithms: strategy improvement algorithms for 2.5 player reachability games and recursive algorithms for the qualitative analysis of parity games. We present a method that, in contrast to existing techniques, tackles both aspects with the best suited approach and works exclusively on the 2.5 player game itself. The resulting technique is powerful enough to handle games with several million states.
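The quantitative core of 2.5 player reachability games can be illustrated with a much simpler special case: computing maximal reachability probabilities in a Markov decision process (a 1.5 player game) by value iteration. This is only a minimal sketch under that simplification; the paper itself uses strategy improvement on the full 2.5 player game, and the states, actions, and transition table below are illustrative, not from the paper.

```python
# Minimal value-iteration sketch: maximal probability of reaching a target
# set in an MDP. delta[(s, a)] maps successor states to probabilities.
def max_reachability(states, actions, delta, target, eps=1e-9):
    v = {s: (1.0 if s in target else 0.0) for s in states}
    while True:
        new = {}
        for s in states:
            if s in target:
                new[s] = 1.0
            else:
                enabled = [a for a in actions if (s, a) in delta]
                # The maximising player picks the best action in each state.
                new[s] = max(
                    (sum(p * v[t] for t, p in delta[(s, a)].items())
                     for a in enabled),
                    default=0.0,
                )
        if max(abs(new[s] - v[s]) for s in states) < eps:
            return new
        v = new

# Toy example: from s0, action 'a' reaches the goal with probability 0.5,
# action 'b' loops forever; the optimal value in s0 is therefore 0.5.
delta = {
    ("s0", "a"): {"goal": 0.5, "sink": 0.5},
    ("s0", "b"): {"s0": 1.0},
}
vals = max_reachability({"s0", "goal", "sink"}, {"a", "b"}, delta, {"goal"})
print(round(vals["s0"], 6))  # → 0.5
```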

    Deciding Probabilistic Automata Weak Bisimulation in Polynomial Time

    Deciding weak probabilistic bisimulation efficiently in the context of probabilistic automata has been an open problem for about a decade. In this work we close this problem by proposing a procedure that checks in polynomial time the existence of a weak combined transition satisfying the step condition of the bisimulation. We also present several extensions of weak combined transitions, such as hyper-transitions and the new concepts of allowed weak combined and hyper-transitions and of equivalence matching, which turn out to be verifiable in polynomial time as well. These results set the ground for the development of more effective compositional analysis algorithms for probabilistic systems.

    Lazy Probabilistic Model Checking without Determinisation

    The bottleneck in the quantitative analysis of Markov chains and Markov decision processes against specifications given in LTL, or as some form of nondeterministic Büchi automata, is the inclusion of a determinisation step for the automaton under consideration. In this paper, we show that full determinisation can be avoided: subset and breakpoint constructions suffice. We have implemented our approach, in both explicit and symbolic versions, in a prototype tool. Our experiments show that our prototype can compete with mature tools like PRISM.
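The paper's central claim is that the heavyweight determinisation of Büchi automata (e.g. Safra's construction) can be replaced by subset and breakpoint constructions. A plain subset construction for an NFA over finite words conveys the basic idea; this is only a simplified illustration, as the breakpoint construction used for Büchi automata additionally tracks a second "breakpoint" set, which this sketch omits.

```python
# Subset construction: each state of the deterministic automaton is the set
# of NFA states reachable on the input read so far.
def subset_construction(states, alphabet, delta, initial):
    """delta[(q, a)] is a set of successor states; returns the reachable
    subset states and the deterministic transition function."""
    start = frozenset(initial)
    seen = {start}
    worklist = [start]
    sdelta = {}
    while worklist:
        cur = worklist.pop()
        for a in alphabet:
            succ = frozenset(q2 for q in cur
                             for q2 in delta.get((q, a), set()))
            sdelta[(cur, a)] = succ
            if succ not in seen:
                seen.add(succ)
                worklist.append(succ)
    return seen, sdelta

# Example NFA: q0 --a--> {q0, q1}, q1 --b--> {q1}.
delta = {("q0", "a"): {"q0", "q1"}, ("q1", "b"): {"q1"}}
subsets, sdelta = subset_construction({"q0", "q1"}, {"a", "b"}, delta, {"q0"})
print(len(subsets))  # → 4 reachable subsets, including the empty set
```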

    On the Efficiency of Deciding Probabilistic Automata Weak Bisimulation

    Weak probabilistic bisimulation on probabilistic automata can be decided by an algorithm that needs to check a polynomial number of linear programming problems encoding weak transitions. It is hence polynomial, but not guaranteed to be strongly polynomial. In this paper we show that strongly polynomial complexity can be ensured for polynomial rational probabilistic automata. We further discuss complexity bounds for generic probabilistic automata. We then consider several practical algorithms and LP transformations that enable an efficient solution of the concrete weak transition problem. This sets the ground for effective compositional minimisation approaches for probabilistic automata and Markov decision processes.

    Hierarchical and compositional verification of cryptographic protocols

    Two important approaches to the verification of security protocols are known under the general names of symbolic and computational, respectively. In the symbolic approach messages are terms of an algebra and the cryptographic primitives are ideally secure; in the computational approach messages are bitstrings and the cryptographic primitives are secure with overwhelming probability. This means, for example, that in the symbolic approach only those who know the decryption key can decrypt a ciphertext, while in the computational approach the probability of decrypting a ciphertext without knowing the decryption key is negligible. Usually, cryptographic protocols are the outcome of the interaction of several components: some of them are based on cryptographic primitives, others on other principles. In general, the result is a complex system that we would like to analyse in a modular way instead of studying it as a single system. A similar situation can be found in the context of distributed systems, where several probabilistic components interact with each other to implement a distributed algorithm.
    In this context, the analysis of the correctness of a complex system is very rigorous and is based on tools from information theory, such as the simulation method, which allows us to decompose large problems into smaller problems and to verify systems hierarchically and compositionally. The simulation method consists of establishing relations between the states of two automata, called simulation relations, and verifying that such relations satisfy appropriate step conditions: each transition of the simulated system can be matched by the simulating system up to the given relation. Using a compositional approach we can study the properties of each small problem independently of the others and then derive the properties of the overall system. Furthermore, hierarchical verification allows us to build several intermediate refinements between specification and implementation. Often hierarchical and compositional verification is simpler and cleaner than direct one-step verification, since each refinement may focus on specific homogeneous aspects of the implementation. In this thesis we introduce a new simulation relation, which we call polynomially accurate simulation, or approximated simulation, which is compositional and allows us to adopt the hierarchical approach in our analyses. Polynomially accurate simulations extend the simulation relations of the distributed systems context in both the strong and the weak case, taking into account the lengths of the computations and the computational properties of the cryptographic primitives. Besides polynomially accurate simulations, we provide other tools that can simplify the analysis of cryptographic protocols: the first one is the concept of conditional automaton, which permits the safe removal of events that occur with negligible probability.
    Starting from a machine that is attackable with negligible probability, if we build an automaton that is conditional on the absence of these attacks, then there exists a simulation between the two. This allows us to work with simulation relations throughout; in particular, we can also prove in a compositional way that the elimination of negligible events from an automaton is safe. This property is justified by the conditional automaton theorem, which states that events are negligible if and only if the identity relation is an approximated simulation from the automaton to its conditional counterpart. Another tool is the execution correspondence theorem, which extends the one of the distributed systems context and justifies the hierarchical approach. In fact, the theorem states that if we have several automata and a chain of simulations between them, then with overwhelming probability each execution of the first automaton is related to an execution of the last automaton. In other words, the probability that the last automaton is not able to simulate an execution of the first one is negligible. Finally, we use the polynomially accurate simulation framework to provide families of automata that implement commonly used cryptographic primitives and to prove that the symbolic approach is sound with respect to the computational approach.
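The simulation method described in this abstract rests on a step condition: every transition of the simulated automaton must be matched by the simulating one, up to the relation. As a minimal sketch, the coarsest exact (non-probabilistic) simulation between two finite labelled transition systems can be computed by iterated refinement; the thesis's polynomially accurate simulations additionally handle probabilities and negligible error, which this toy version omits, and the two example systems are invented for illustration.

```python
# Compute the coarsest simulation: start from all pairs and repeatedly
# discard pairs (s, t) where t cannot match some transition of s.
def coarsest_simulation(states1, states2, trans1, trans2):
    """trans[(s, a)] is a set of successors. Returns {(s, t) : t simulates s}."""
    rel = {(s, t) for s in states1 for t in states2}
    changed = True
    while changed:
        changed = False
        for (s, t) in list(rel):
            for (s1, a) in trans1:
                if s1 != s:
                    continue
                for s_succ in trans1[(s, a)]:
                    # Step condition: t needs an a-successor related to s_succ.
                    if not any((s_succ, t_succ) in rel
                               for t_succ in trans2.get((t, a), set())):
                        rel.discard((s, t))
                        changed = True
                        break
                if (s, t) not in rel:
                    break
    return rel

# Implementation: p --a--> p1.  Specification: q --a--> q1 and q --b--> q2.
# The specification can match everything the implementation does, so q
# simulates p.
trans1 = {("p", "a"): {"p1"}}
trans2 = {("q", "a"): {"q1"}, ("q", "b"): {"q2"}}
rel = coarsest_simulation({"p", "p1"}, {"q", "q1", "q2"}, trans1, trans2)
print(("p", "q") in rel)  # → True
```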

    Learning to Complement Büchi Automata


    Assessment of Flame Transfer Function Formulations for the Thermoacoustic Analysis of Lean Burn Aero-Engine Combustors

    The numerical analysis of thermoacoustic instability in lean burn aero-engines requires proper Flame Transfer Functions (FTFs) able to describe the complex physical phenomena characterizing the coupling between heat release rate fluctuations and the acoustic field, a coupling further complicated by the use of liquid fuel together with advanced injection systems. In this work, simple FTF formulations have been applied to the thermoacoustic analysis of a tubular combustor equipped with a PERM (Partially Evaporating and Rapid Mixing) injection system, with the main aim of assessing their capabilities in predicting thermoacoustic instabilities in lean burn aero-engine combustors.